Learning hashing with affinity-based loss functions using auxiliary coordinates
Authors
Abstract
In binary hashing, one wants to learn a function that maps a high-dimensional feature vector to a vector of binary codes, for application to fast image retrieval. This typically results in a difficult optimization problem, nonconvex and nonsmooth, because of the discrete variables involved. Much work has simply relaxed the problem during training, solving a continuous optimization and truncating the codes a posteriori. This gives reasonable results but is suboptimal. Recent work has applied alternating optimization to the objective over the binary codes and achieved better results, but the hash function was still learned a posteriori, which remains suboptimal. We propose a general framework for learning hash functions using affinity-based loss functions that closes the loop and optimizes jointly over the hash functions and the binary codes. The resulting algorithm can be seen as a corrected, iterated version of the procedure of optimizing first over the codes and then learning the hash function. Compared to this, our optimization is guaranteed to obtain better hash functions while being not much slower, as demonstrated experimentally on various supervised and unsupervised datasets. In addition, the framework facilitates the design of optimization algorithms for arbitrary types of loss and hash functions.
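To make the joint optimization concrete, the sketch below is an assumption-laden illustration, not the authors' implementation: it alternates the two steps the abstract describes, a Z-step that updates the binary codes under an affinity-based loss plus a penalty tying them to the hash function's outputs, and an h-step that refits the hash function to the current codes. The linear hash function, the greedy bit-flip code update and the least-squares fit are all illustrative choices.

```python
# Minimal sketch (not the authors' code) of the alternation described above:
# alternate between optimizing the binary codes Z under an affinity-based loss
# plus a penalty tying them to the hash function's outputs (Z-step), and
# refitting the hash function to predict those codes (h-step).
import numpy as np

def affinity_loss(Z, W_aff):
    # Simple affinity-based loss: similar pairs (W_aff = +1) should have small
    # Hamming distance, dissimilar pairs (W_aff = -1) a large one.
    H = (Z[:, None, :] != Z[None, :, :]).sum(-1)       # pairwise Hamming distances
    n_bits = Z.shape[1]
    return np.sum(np.where(W_aff > 0, H, n_bits - H))

def z_step(Z, X, W_aff, W, b, mu):
    # Greedy bit-flip descent on the codes: flip a bit only if it lowers
    # affinity loss + mu * disagreement with the current hash function h(X).
    Hx = (X @ W + b > 0).astype(int)                    # h(X), current predictions
    for n in range(Z.shape[0]):
        for l in range(Z.shape[1]):
            old = affinity_loss(Z, W_aff) + mu * np.sum(Z != Hx)
            Z[n, l] ^= 1
            if affinity_loss(Z, W_aff) + mu * np.sum(Z != Hx) >= old:
                Z[n, l] ^= 1                            # flip back if no improvement
    return Z

def h_step(X, Z):
    # Fit one linear (least-squares) hash function per bit to the current codes;
    # any binary classifier could be used here instead.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    Wb = np.linalg.lstsq(Xb, 2 * Z - 1, rcond=None)[0]
    return Wb[:-1], Wb[-1]

def learn_hash(X, W_aff, n_bits=8, n_iter=5):
    rng = np.random.default_rng(0)
    Z = rng.integers(0, 2, size=(X.shape[0], n_bits))   # random initial codes
    W, b = h_step(X, Z)
    for mu in np.geomspace(0.1, 10.0, n_iter):          # increasing penalty schedule
        Z = z_step(Z, X, W_aff, W, b, mu)
        W, b = h_step(X, Z)
    return W, b
```

A single pass with a small penalty and no iteration would correspond to the two-stage "optimize the codes, then fit the hash function" procedure; iterating while increasing the penalty is what forces the codes and the hash function to agree by the end.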
Similar resources
Optimizing affinity-based binary hashing using auxiliary coordinates
In supervised binary hashing, one wants to learn a function that maps a high-dimensional feature vector to a vector of binary codes, for application to fast image retrieval. This typically results in a difficult optimization problem, nonconvex and nonsmooth, because of the discrete variables involved. Much work has simply relaxed the problem during training, solving a continuous optimization, an...
Optimizing binary autoencoders using auxiliary coordinates, with application to learning binary hashing
We consider the problem of binary hashing, where given a high-dimensional vector x ∈ R^D, we want to map it to an L-bit vector z = h(x) ∈ {0, 1}^L using a hash function h, while preserving the neighbors of x in the binary space. Binary hashing has emerged in recent years as an effective technique for fast search on image (and other) databases [6]. While the search in the original space would cost O...
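The retrieval setting this snippet refers to can be illustrated with a short sketch (the linear hash function, bit width and random data below are assumptions for illustration, not taken from the paper): each vector is mapped to an L-bit code, and a query is answered by Hamming-distance search over the packed codes instead of distance computations in the original D-dimensional space.

```python
# Illustrative sketch of binary-hashing retrieval: a linear hash function
# h(x) = 1[Wx + b > 0] maps each vector to an L-bit code, and queries are
# answered by Hamming-distance search (popcount of XOR) over packed codes.
import numpy as np

def hash_codes(X, W, b):
    # Map each row of X to an L-bit binary code and pack the bits into bytes.
    return np.packbits(X @ W + b > 0, axis=1)

def hamming_search(query_code, db_codes, k=10):
    # Hamming distance = number of set bits in the XOR of the packed codes.
    dists = np.unpackbits(query_code ^ db_codes, axis=1).sum(axis=1)
    return np.argsort(dists)[:k]

# Usage with random data and a random (untrained) hash function:
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))           # database of N = 1000 vectors in R^64
W, b = rng.normal(size=(64, 32)), 0.0     # L = 32 bits
db = hash_codes(X, W, b)
q = hash_codes(X[:1], W, b)
print(hamming_search(q, db, k=5))         # indices of the 5 nearest codes
```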
Compressed Image Hashing using Minimum Magnitude CSLBP
Image hashing tolerates compression, enhancement and other signal processing operations on digital images, which are usually acceptable manipulations, whereas cryptographic hash functions are very sensitive to even single-bit changes in an image. An image hash is a sum of important quality features in quantized form. In this paper, we propose a novel image hashing algorithm for authentication which i...
Dual Projective Hashing and Its Applications - Lossy Trapdoor Functions and More
We introduce the notion of dual projective hashing. This is similar to Cramer-Shoup projective hashing, except that instead of smoothness, which stipulates that the output of the hash function looks random on NO instances, we require invertibility, which stipulates that the output of the hash function on NO instances uniquely determines the hashing key, and moreover, that there is a trapdoor whi...
Discriminative Cross-View Binary Representation Learning
Learning compact representations is vital and challenging for large-scale multimedia data. Cross-view/cross-modal hashing for effective binary representation learning has received significant attention with the exponentially growing availability of multimedia content. Most existing cross-view hashing algorithms emphasize the similarities in individual views, which are then connected via cross-view sim...
Journal title:
- CoRR
Volume abs/1501.05352, issue
Pages -
Publication date 2015